[Site] Network issues

LtU's hosting provider has been experiencing severe network problems recently. Over this past weekend and today, there have been outages lasting hours at a time (an eternity relative to the provider's SLA, which promises 99.99% uptime, i.e. less than an hour of downtime per year). The provider is now saying that problems could continue for weeks, while they upgrade their network equipment.

As it happens, a move to (yet another) new server, at a bigger and more reliable provider, has been in the works for some time, so we're going to accelerate that process and make that move as soon as possible.

Until then, if you find yourself gnawing on the corner of your desk due to a temporary inability to reach LtU (what, you mean that's just me?), look in the directory where you've saved the papers you saw here, and read one of the ones that you never got around to reading. By the time you're done, LtU will be accessible again. :)

Alloy: A Simple Structural Modeling Language Based on First-Order Logic

The Alloy Analyzer is a tool developed by the Software Design Group for analyzing models written in Alloy, a simple structural modeling language based on first-order logic. The tool can generate instances of invariants, simulate the execution of operations (even those defined implicitly), and check user-specified properties of a model. Alloy and its analyzer have been used primarily to explore abstract software designs. Its uses in analyzing code for conformance to a specification and as an automatic test case generator are being investigated in ongoing research projects.

Alloy has been mentioned before, but with the recent discussions revolving around IDEs and questions about whether some kinds of checking belong in the language or in the tools surrounding the language, I thought it might be worth revisiting. In fact, it's tempting to suggest that we at LtU adopt a new category for stories: "Lightweight Formal Methods," and that we editors attempt to establish a continuum with respect to stories that fit the category. For example, Pierce makes the observation in TAPL that type systems are a particular kind of lightweight formal method, and that one of their benefits is that they're the only kind that are guaranteed to be used. Alloy falls in the "outside the language proper, but still incremental" category, and somewhere else on the spectrum you have full-blown theorem provers like Twelf, Coq, MetaPRL, et al. Does it make sense to try to unify some of the discussions about the boundary between languages and external tools under this umbrella?
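
For a taste of what "structural modeling based on first-order logic" means in practice: an Alloy model is a handful of relations constrained by first-order sentences, and the analyzer searches for finite instances or counterexamples within a user-chosen bound. A typical invariant, such as acyclicity of a next relation over nodes, is just a sentence like the following (my illustration, not taken from the Alloy documentation):

```latex
% Acyclicity of a binary relation next over a set Node; next^+ denotes
% the transitive closure of next. An Alloy fact states this directly.
\neg \exists n \in \mathit{Node}.\; (n, n) \in \mathit{next}^{+}
```

Checking a user-specified property then amounts to asking the analyzer for an instance of the property's negation within the bound; if none exists, the property holds for all models up to that size.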

Does Visual Studio Rot the Mind?

A long and interesting rant by Charles Petzold.

Obviously this is mostly about the IDE side of things (seeing as VS is an IDE).

Some of the features that VS provides are intended to overcome the huge size of the standard libraries, and you might argue this isn't really a language issue. At some level this is indeed a valid argument. However, I think we should pause every once in a while and wonder whether better programming language abstractions might make modern programming - with GUIs, XML, etc. - easier, and eliminate some of the need for huge libraries. In fact, one can argue that LINQ (Cω) is a step in this direction, as regards data access.
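
To make the LINQ point concrete with a stand-in (this is ordinary Java with streams, written by me for illustration; it is not LINQ or Cω): once query operators are part of the language's everyday vocabulary, the loop-and-accumulate plumbing that libraries otherwise force on you collapses into a single declarative expression.

```java
// Query-style data access as a language-level expression, standing in
// for the LINQ/Cω idea; the filter/sort/collect pipeline replaces an
// explicit loop, a temporary list, and manual accumulation.
import java.util.List;
import java.util.stream.Collectors;

class QueryDemo {
    public static void main(String[] args) {
        List<String> langs = List.of("Alloy", "Spec#", "Sing#", "PQL", "Cw");
        List<String> shortNames = langs.stream()
            .filter(s -> s.length() <= 3)   // "where" clause
            .sorted()                       // "order by" clause
            .collect(Collectors.toList());  // materialize the result
        System.out.println(shortNames);     // prints [Cw, PQL]
    }
}
```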

It is also worth noting that when IDEs influence the way programming is done, they influence the way languages are used, and thus influence the design space. Programmers demand new language features partly as a response to their experience with the language as it is used in practice.

Finally, the impact on teaching and learning programming shouldn't be overlooked. Students naturally want to produce cool GUI applications and use VS. If this makes it harder to introduce them to different programming techniques and languages (and I think it does), this can be highly problematic.

Here are some choice quotes from Petzold (who also said that "the whole history of new programming languages... for Windows has involved the struggle to reduce the windows hello-world program down to something small, sleek, and elegant"):

To get IntelliSense to work right, not only must you code in a bottom-up structure, but within each method or property, you must also write your code linearly from beginning to end — just as if you were using that old DOS line editor, EDLIN. You must define all variables before you use them. No more skipping around in your code.

If we select a new project type of Windows Application, for example, and give it a name and location on a local drive, Visual Studio generates sufficient code so that this project is immediately compilable and runnable... Somehow, we have been persuaded that this is the proper way to program. I don’t know why. Personally, I find starting a program with an empty source code file to be very enjoyable.

This is the file in which Visual Studio inserts generated code when you design your form. Visual Studio really doesn’t want you messing around with this file, and for good reason. Visual Studio is expecting this generated code to be in a certain format, and if you mess with it, it may not be able to read it back in the next time you open the project.

This bothered me because Visual Basic was treating a program not as a complete coherent document, but as little snippets of code attached to visual objects. That’s not what a program is. That’s not what the compiler sees. How did one then get a sense of the complete program? It baffled me.

If Visual Studio really wanted you to write good code, every time you dragged a control onto your form, an annoying dialog would pop up saying “Type in a meaningful name for this control.” But Visual Studio is not interested in having you write good code. It wants you to write code fast.

New blog

A fairly recent blog that might interest some LtU readers.

It's early days yet, so I am sure the author could use some encouragement and support...

OOPSLA 2005 Reports

Follow these links.

An Overview of the Singularity Project

Singularity is a research project in Microsoft Research that started with the question: what would a software platform look like if it was designed from scratch with the primary goal of dependability? Singularity is working to answer this question by building on advances in programming languages and tools to develop a new system architecture and operating system (named Singularity), with the aim of producing a more robust and dependable software platform. Singularity demonstrates the practicality of new technologies and architectural decisions, which should lead to the construction of more robust and dependable systems...
Singularity... starts from a premise of language safety and builds a system architecture that supports and enhances the language guarantees.

An interesting overview of what sounds like a fascinating project.

The choice of implementation language is also interesting:

Singularity is written in Sing#, which is an extension to the Spec# language developed in Microsoft Research. Spec# itself is an extension to Microsoft’s C# language that provides constructs (pre- and post-conditions and object invariants) for specifying program behavior. Specifications can be statically verified by the Boogie verifier or checked by compiler-inserted run-time tests. Sing# extends this language with support for channels and low-level constructs necessary for system code... Integrating a feature into a language allows more aspects of a program to be verified. Singularity’s constructs allow communication to be statically verified.
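
For readers who haven't seen Spec#, here is the flavor of those constructs translated into a plain-Java approximation (a loose analogue I wrote for illustration; in Spec# the conditions are declarations that Boogie can verify statically, not checks that fire at run time):

```java
// Not Spec#: runtime asserts standing in for Spec#'s declared
// pre-/post-conditions and object invariants.
class Account {
    private int balance;  // intended object invariant: balance >= 0

    void withdraw(int amount) {
        assert 0 < amount && amount <= balance : "precondition";
        int oldBalance = balance;  // plays the role of Spec#'s old(balance)
        balance -= amount;
        assert balance == oldBalance - amount : "postcondition";
        assert balance >= 0 : "object invariant";
    }
}
```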

An interesting aspect is the support for meta-programming, which is implemented in an unusual manner:

Compile-time reflection (CTR) is a partial substitute for the CLR’s full reflection capability. CTR is similar to techniques such as macros, binary code rewriting, aspects, meta-programming, and multi-stage languages. The basic idea is that programs may contain place-holder elements (classes, methods, fields, etc.) that are subsequently expanded by a generator.
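
The place-holder/generator split is easier to see with a small example. In Java terms (an entirely hypothetical illustration; Sing#'s actual mechanism and syntax differ), it is as if you wrote only the first class below, and a compile-time generator derived the second from its declared fields before ordinary compilation:

```java
// What the programmer writes: a plain class whose marshalling code is
// left as a place-holder, to be expanded at compile time.
class Point {
    int x;
    int y;
}

// What a CTR-style generator might emit for Point (hypothetical output;
// one little-endian 4-byte chunk per declared int field):
class PointMarshaller {
    static byte[] marshal(Point p) {
        return new byte[] {
            (byte) p.x, (byte) (p.x >>> 8), (byte) (p.x >>> 16), (byte) (p.x >>> 24),
            (byte) p.y, (byte) (p.y >>> 8), (byte) (p.y >>> 16), (byte) (p.y >>> 24),
        };
    }
}
```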

Many other interesting design decisions are discussed in the paper (e.g., various design-by-contract facilities), so do check it out.

Map of LtU readers

Cool new meme...

Following Eszter and Brian, I have created a Frappr map for LtU readers.

Click on that link to (a) see where some of the readers are, and (b) add yourself.

Finding Application Errors Using PQL: A Program Query Language

A number of effective error detection tools have been built in recent years to check if a program conforms to certain design rules. An important class of design rules deals with sequences of events associated with a set of related objects. This paper presents a language called PQL (Program Query Language) that allows programmers to express such questions easily in an application-specific context. A query looks like a code excerpt corresponding to the shortest amount of code that would violate a design rule. Details of the target application's precise implementation are abstracted away. The programmer may also specify actions to perform when a match is found, such as recording relevant information or even correcting an erroneous execution on the fly. We have developed both static and dynamic techniques to find solutions to PQL queries. Our static analyzer finds all potential matches conservatively using a context-sensitive, flow-insensitive, inclusion-based pointer alias analysis. Static results are also useful in reducing the number of instrumentation points for dynamic analysis. Our dynamic analyzer instruments the source program to catch all violations precisely as the program runs and optionally to perform user-specified actions. We have implemented techniques described in this paper and used this combination of static and dynamic analysis to successfully find 206 breaches of security and important resource leaks in 6 large real-world open-source Java applications containing a total of nearly 60,000 classes.

I saw the talk for this paper at OOPSLA two weeks ago and liked it very much. In a sense, I would consider it a new application of aspect-oriented techniques.
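
The "query as the shortest violating code excerpt" idea is the memorable part. For a resource-leak rule over JDBC objects, the excerpt a query would be modeled on looks roughly like this (my reconstruction of the bug pattern; this is plain Java, not the paper's PQL syntax):

```java
// The shortest excerpt violating the rule "every Statement opened on a
// Connection must eventually be closed"; a PQL query describes this
// event sequence abstractly, and the static/dynamic analyzers find
// application code that matches it.
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

class LeakPattern {
    static void printNames(Connection conn) throws SQLException {
        Statement st = conn.createStatement();                    // open
        ResultSet rs = st.executeQuery("SELECT name FROM users"); // use
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        // violation: neither rs.close() nor st.close() on any path out
    }
}
```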

Just What is it that Makes Martin-Löf's Type Theory so Different, so Appealing?

Martin-Löf's Type Theory (M-LTT) was developed by Per Martin-Löf in the early 1970s, as a constructive foundation for mathematics. The theory is clear and simple. Because of this it has been used for everything from program derivation to handling donkey sentences.

In this talk we will present and give a flavour of Martin-Löf's Type Theory. We will highlight the use of proof theoretical justification of the rules, without explaining this in full detail. We will present the rules for various types, emphasising the uniformity of the presentation. The types that we will present will include those needed to express the logical constants falsum, or, and, not, implies, for all, and there exists.

Yet another propositions-as-types tutorial.

It seems pretty easy to follow, and it is quite short (16 pages).
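
If you'd rather see the propositions-as-types reading executed than described, here is the correspondence for the talk's list of logical constants, written as machine-checked terms in Lean (my sketch, not material from the talk; Lean's type theory is a descendant of Martin-Löf's rather than M-LTT itself):

```lean
-- Each logical constant is a type former, and a proof is a term:
-- falsum ~ False, and ~ And, or ~ Or, not ~ (· → False),
-- implies ~ function type, for all ~ dependent function, exists ~ ∃.

example (A B : Prop) (h : A ∧ B) : B ∧ A := ⟨h.2, h.1⟩          -- and
example (A B : Prop) (a : A) : A ∨ B := Or.inl a                -- or
example (A : Prop) (na : ¬A) (a : A) : False := na a            -- not / falsum
example (A : Prop) : A → A := fun a => a                        -- implies
example (P : Nat → Prop) (h : ∀ n, P n) : ∃ n, P n := ⟨0, h 0⟩  -- for all / exists
```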

P.S.

Why all the theory on LtU these days? Because the editors who keep posting are into theory. The other editors should be nudged to start posting cool hands-on stuff...

Lowering the barriers to programming

(via the LtU forum)

Lowering the barriers to programming: A taxonomy of programming environments and languages for novice programmers. Caitlin Kelleher, Randy Pausch. ACM Computing Surveys. Vol. 37. No. 2. Jun 2005.

Since the early 1960's, researchers have built a number of programming languages and environments with the intention of making programming accessible to a larger number of people. This article presents a taxonomy of languages and environments designed to make programming more accessible to novice programmers of all ages. The systems are organized by their primary goal, either to teach programming or to use programming to empower their users, and then by each system's authors' approach to making learning to program easier for novice programmers. The article explains all categories in the taxonomy, provides a brief description of the systems in each category, and suggests some avenues for future work in novice programming environments and languages.

You might also want to check out this older LtU item.